Discussion of “Boosting Algorithms: Regularization, Prediction and Model Fitting” by Peter Bühlmann and Torsten Hothorn

Authors

  • Peter Bühlmann
  • Torsten Hothorn
  • Trevor Hastie
Abstract

We congratulate the authors (hereafter BH) for an interesting take on the boosting technology, and for developing a modular computational environment in R for exploring their models. Their use of low-degree-of-freedom smoothing splines as a base learner provides an interesting approach to adaptive additive modeling. The notion of “Twin Boosting” is interesting as well; besides the adaptive lasso, we have seen the idea applied more directly for the lasso and Dantzig selector (James, Radchenko & Lv 2007). In this discussion we elaborate on the connections between L2-boosting of a linear model and infinitesimal forward stagewise linear regression (Section 5.2.1 in BH). We then take the authors to task on their definition of degrees of freedom (Section 5.3 of BH).
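The discussion centers on L2-boosting of a linear model, which repeatedly fits the current residual with a simple base learner and takes a small step. A minimal sketch of componentwise L2-boosting (the simplest case discussed in BH, Section 5.2.1): at each step, fit each predictor to the residual by univariate least squares, update only the best-fitting coefficient by a small fraction ν, and repeat. The function name and interface below are illustrative, not from the paper or the mboost package.

```python
import numpy as np

def l2_boost(X, y, nu=0.1, n_steps=100):
    """Componentwise L2-boosting for a linear model (illustrative sketch).

    At each step, compute the univariate least-squares coefficient of every
    column of X on the current residual, pick the column with the smallest
    residual sum of squares, and move its coefficient by a fraction nu.
    """
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()
    for _ in range(n_steps):
        # univariate least-squares coefficient of each predictor on the residual
        coefs = X.T @ resid / (X ** 2).sum(axis=0)
        # residual sum of squares after each candidate univariate fit
        sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
        j = np.argmin(sse)
        beta[j] += nu * coefs[j]          # shrunken update of the winner only
        resid -= nu * coefs[j] * X[:, j]  # update the working residual
    return beta
```

With small ν and many steps, the coefficient paths of this procedure closely track those of infinitesimal forward stagewise linear regression, which is the connection the discussion elaborates.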


Related works

Boosting Algorithms: Regularization, Prediction and Model Fitting, by Peter Bühlmann and Torsten Hothorn

We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selectio...



Rejoinder: Boosting Algorithms: Regularization, Prediction and Model Fitting

We are grateful that Hastie points out the connection to degrees of freedom for LARS which leads to another—and often better—definition of degrees of freedom for boosting in generalized linear models. As Hastie writes and as we said in the paper, our formula for degrees of freedom is only an approximation: the cost of searching, for example, for the best variable in componentwise linear least s...


Model-based Boosting 2.0

This is an extended version of the manuscript Torsten Hothorn, Peter Bühlmann, Thomas Kneib, Matthias Schmid and Benjamin Hofner (2010), Model-based Boosting 2.0. Journal of Machine Learning Research, 11, 2109–2113; http://jmlr.csail.mit.edu/papers/v11/hothorn10a.html. We describe version 2.0 of the R add-on package mboost. The package implements boosting for optimizing general risk function...


Model-based Boosting 2.0

We describe version 2.0 of the R add-on package mboost. The package implements boosting for optimizing general risk functions using component-wise (penalized) least squares estimates or regression trees as base-learners for fitting generalized linear, additive and interaction models to potentially high-dimensional data.



Publication date: 2007